Growing Layers of Perceptrons: Introducing the Extentron Algorithm

Author

  • John M. Zelle
Abstract

The ideas presented here are based on two observations of perceptrons: (1) when the perceptron learning algorithm cycles among hyperplanes, the hyperplanes may be compared to select one that gives a best split of the examples, and (2) it is always possible for the perceptron to build a hyperplane that separates at least one example from all the rest. We describe the Extentron, which grows multi-layer networks capable of distinguishing non-linearly-separable data using the simple perceptron rule for linear threshold units. The resulting algorithm is simple, very fast, scales well to large problems, retains the convergence properties of the perceptron, and can be completely specified using only two parameters. Results are presented comparing the Extentron to other neural network paradigms and to symbolic learning systems.
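As a rough illustration of observation (1), the Python sketch below (the function name and parameters are ours, not the paper's) runs the simple perceptron rule on a linear threshold unit and, as the algorithm cycles among hyperplanes, keeps the one that gives the best split of the training examples.

import numpy as np

def best_split_perceptron(X, y, epochs=20, lr=1.0, seed=None):
    """Perceptron rule on a linear threshold unit; while the algorithm
    cycles among hyperplanes, remember the one that classifies the most
    training examples correctly (X: n x d features, y: labels in {-1, +1})."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])    # constant input for the bias weight
    w = np.zeros(d + 1)
    best_w, best_correct = w.copy(), 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            if y[i] * (Xb[i] @ w) <= 0:     # misclassified: apply the update
                w = w + lr * y[i] * Xb[i]
        correct = int(np.sum(np.sign(Xb @ w) == y))
        if correct > best_correct:          # best split of the examples so far
            best_w, best_correct = w.copy(), correct
    return best_w, best_correct

Per observation (2), even on non-separable data such a unit can be trained to separate at least one example from the rest, which is what allows the network to keep growing new layers.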


Similar resources

Modeling of measurement error in refractive index determination of fuel cell using neural network and genetic algorithm

Abstract: In this paper, a method for determining the refractive index of a fuel-cell membrane based on a three-longitudinal-mode laser heterodyne interferometer is presented. The optical path difference between the target and reference paths is fixed, and the phase shift is then calculated in terms of the refractive index shift. The measurement accuracy of this system is limited by nonlinearity erro...
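For context, the conventional interferometric relation between phase shift and refractive index change over a fixed path length is sketched below in Python; the exact model and error terms of the cited system may differ.

import math

def refractive_index_shift(phase_shift_rad, wavelength_m, path_length_m):
    """Textbook relation (an assumption here, not the paper's exact model):
    delta_phi = 2*pi * L * delta_n / lambda, hence
    delta_n = delta_phi * lambda / (2 * pi * L)."""
    return phase_shift_rad * wavelength_m / (2 * math.pi * path_length_m)

# Example: a pi/2 phase shift over a 1 mm path at 633 nm gives ~1.6e-4.
print(refractive_index_shift(math.pi / 2, 633e-9, 1e-3))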


4. Multilayer perceptrons and back-propagation

Multilayer feed-forward networks, or multilayer perceptrons (MLPs), have one or several "hidden" layers of nodes. This implies that they have two or more layers of weights. The limitations of simple perceptrons do not apply to MLPs. In fact, as we will see later, a network with just one hidden layer can represent any Boolean function (including the XOR which is, as we saw, not linearly separab...
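The XOR claim is easy to make concrete. The Python sketch below hard-codes a one-hidden-layer network of linear threshold units (weights chosen by hand, not learned): the hidden layer forms OR and AND of the inputs, and the output unit fires when OR is on but AND is off.

import numpy as np

def step(x):
    """Linear threshold unit: output 1 when the net input is positive."""
    return (np.asarray(x) > 0).astype(int)

def xor_mlp(x1, x2):
    """One hidden layer suffices for XOR: h = [OR(x1, x2), AND(x1, x2)],
    output = 1 when OR fires and AND does not."""
    x = np.array([x1, x2])
    h = step(np.array([[1, 1], [1, 1]]) @ x - np.array([0.5, 1.5]))
    return int(step(np.array([1, -1]) @ h - 0.5))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))    # prints 0, 1, 1, 0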


Growing subspace pattern recognition methods and their neural-network models

In statistical pattern recognition, the decision of which features to use is usually left to human judgment. If possible, automatic methods are desirable. Like multilayer perceptrons, learning subspace methods (LSMs) have the potential to integrate feature extraction and classification. In this paper, we propose two new algorithms, along with their neural-network implementations, to overcome ce...
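As background only (the paper's growing algorithms go beyond this), a classical subspace classifier can be sketched in a few lines of Python: each class keeps a low-dimensional principal subspace of its training samples, and a new sample is assigned to the class whose subspace captures most of its energy. The names and the fixed subspace dimension are illustrative assumptions.

import numpy as np

def fit_class_subspaces(X, y, dim=2):
    """Keep the top `dim` principal directions of each class's samples
    (classical CLAFIC-style baseline, given only for orientation)."""
    bases = {}
    for c in np.unique(y):
        Xc = X[y == c]
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)  # principal directions
        bases[c] = vt[:dim]
    return bases

def classify(x, bases):
    """Pick the class whose subspace captures the largest projection energy."""
    return max(bases, key=lambda c: float(np.sum((bases[c] @ x) ** 2)))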


A fuzzy rule-based algorithm to train perceptrons

In this paper, a method to train perceptrons using fuzzy rules is presented. The fuzzy rules linguistically describe how to update the weights as well as how to state the desired output of the neurons in the hidden layers. The version for networks with one hidden layer is carefully described and illustrated with examples.
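Purely as a toy illustration of what a linguistic weight-update rule can look like once defuzzified into a numeric step (the membership functions and the rule itself are our inventions, not the paper's):

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] with its peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def fuzzy_weight_delta(error, inp):
    """Toy rule base: 'IF error is positive-large AND |input| is large THEN
    increase the weight a lot', plus its mirror image for negative error.
    The two rule strengths are defuzzified by a weighted average."""
    pos_large = tri(error, 0.2, 1.0, 1.8)
    neg_large = tri(-error, 0.2, 1.0, 1.8)
    inp_large = tri(abs(inp), 0.2, 1.0, 1.8)
    w_inc, w_dec = min(pos_large, inp_large), min(neg_large, inp_large)
    if w_inc + w_dec == 0:
        return 0.0
    delta = (w_inc * 0.5 + w_dec * (-0.5)) / (w_inc + w_dec)
    return float(np.sign(inp)) * delta      # keep the usual error*input direction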


Phrase recognition by filtering and ranking with perceptrons

We present a phrase recognition system based on perceptrons, and an online learning algorithm to train them together. The recognition strategy applies learning in two layers, first at word level, to filter words and form phrase candidates, second at phrase level, to rank phrases and select the optimal ones. We provide a global feedback rule which reflects the dependencies among perceptrons and ...
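A compressed sketch of the filter-and-rank idea in Python (the feature layout, thresholds, and greedy non-overlap selection are our assumptions, not the published system): a word-level perceptron filters which words may anchor a phrase, and a phrase-level perceptron ranks the surviving candidate spans.

import numpy as np

def filter_and_rank(word_feats, span_feats, w_word, w_phrase):
    """word_feats: n x d array of per-word features; span_feats: dict mapping
    (start, end) spans to feature vectors; w_word, w_phrase: perceptron weights.
    Returns the selected non-overlapping phrases with their scores."""
    allowed = set(np.where(word_feats @ w_word > 0)[0])   # word-level filter
    candidates = [
        ((i, j), float(f @ w_phrase))
        for (i, j), f in span_feats.items()
        if i in allowed and j in allowed                  # both ends survived
    ]
    chosen, used = [], set()
    for span, score in sorted(candidates, key=lambda c: -c[1]):
        i, j = span
        if score > 0 and used.isdisjoint(range(i, j + 1)):
            chosen.append((span, score))
            used.update(range(i, j + 1))
    return chosen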




Publication date: 1992